Mutual relationship between Entropies

Authors

  • Angel Garrido
  • Janos Neu
Abstract

Our paper analyzes some new lines of advance on an evolving concept, the so-called Entropy. We need to model this measure under adequate conditions, departing from vague pieces of information. For this, it is necessary to analyze the relationship between several such measures, which may be of different types, together with their interesting applications: Graph Entropy, Metric Entropy, Algorithmic Entropy, Quantum Entropy, and Topological Entropy.

Keywords: Measure Theory, Fuzzy Measures, Symmetry, Entropy.

Mathematics Subject Classification: 68R10, 68R05, 05C78, 78M35.

1. Introduction

The study of the different concepts of Entropy is very interesting now, not only in Physics, but also in Information Theory [27] and other Mathematical Sciences, considered in their more general vision [6, 44]. It may also be a very useful tool in Biocomputing, for instance, or in many other fields, such as the Environmental Sciences, because, among other interpretations with important practical consequences, the law of Entropy means that energy cannot be fully recycled. Many quotations have been made to date on the content and significance of this fuzzy measure. Among them we pick out "Gain in Entropy always means loss of Information, and nothing more" (G. N. Lewis) and "Information is just known Entropy. Entropy is just unknown Information" (M. P. Frank, Physical Limits of Computing). Mutual Information and Relative Entropy, also called the Kullback-Leibler divergence, among other related concepts [6, 11-14, 17], have been very useful in Learning Systems, in both the supervised and the unsupervised cases. Our paper attempts to analyze the mutual relationship between the distinct types of entropies, such as the Quantum Entropy, also called Von Neumann Entropy; ...

1 AMO (Advanced Modeling and Optimization). ISSN: 1841-4311
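As a concrete anchor for the measures the introduction names, the Shannon entropy, the Relative Entropy (Kullback-Leibler divergence), and the Mutual Information of a joint distribution can be sketched in a few lines of Python. This is a minimal illustration with hypothetical helper names, not code from the paper:

```python
import math

def shannon_entropy(p):
    """Shannon entropy H(p) = -sum_i p_i log2 p_i, in bits."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

def kl_divergence(p, q):
    """Relative entropy D(p || q); assumes q_i > 0 wherever p_i > 0."""
    return sum(pi * math.log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def mutual_information(joint):
    """I(X;Y) = D(p(x,y) || p(x) p(y)), from a joint distribution table."""
    px = [sum(row) for row in joint]           # marginal of X (rows)
    py = [sum(col) for col in zip(*joint)]     # marginal of Y (columns)
    return sum(
        joint[i][j] * math.log2(joint[i][j] / (px[i] * py[j]))
        for i in range(len(joint)) for j in range(len(joint[0]))
        if joint[i][j] > 0
    )
```

For example, a fair coin has entropy 1 bit, and a joint table that factorizes into its marginals has zero mutual information.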


Similar resources

Fundamental properties on Tsallis entropies

A chain rule and a subadditivity for the entropy of type β, which is one of the nonextensive (nonadditive) entropies, were derived by Z. Daróczy. In this paper, we study the further relations among Tsallis type entropies, which are typical nonextensive entropies. We show some inequalities related to Tsallis entropies, especially the strong subadditivity for Tsallis type entropies and the subaddit...
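The nonextensive entropy discussed above has a compact definition, S_q(p) = (1 - Σ_i p_i^q) / (q - 1), which recovers the Shannon entropy as q → 1 and obeys the pseudo-additivity S_q(A,B) = S_q(A) + S_q(B) + (1 - q) S_q(A) S_q(B) for independent systems. A minimal Python sketch (the function name is illustrative):

```python
import math

def tsallis_entropy(p, q):
    """Tsallis entropy S_q(p) = (1 - sum_i p_i^q) / (q - 1)."""
    if q == 1.0:
        # The q -> 1 limit recovers the Shannon entropy (natural log).
        return -sum(pi * math.log(pi) for pi in p if pi > 0)
    return (1.0 - sum(pi ** q for pi in p)) / (q - 1.0)
```

Pseudo-additivity can be checked numerically: for two independent fair coins and q = 2, the entropy of the product distribution equals S_2 + S_2 + (1 - 2) S_2 S_2.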


Fast computation of entropies and mutual information for multispectral images

This paper describes the fast computation, and some applications, of entropies and mutual information for color and multispectral images. It is based on the compact coding and fast processing of multidimensional histograms for digital images.
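For a pair of single-channel images, the histogram-based computation described above reduces to accumulating a joint histogram and reading the marginal and joint entropies off it, with I(A;B) = H(A) + H(B) - H(A,B). A small pure-Python sketch, treating images as flat lists of quantized intensities (names are illustrative):

```python
from collections import Counter
import math

def joint_histogram(img_a, img_b):
    """Count co-occurring intensity pairs from two equally sized images."""
    return Counter(zip(img_a, img_b))

def entropies_from_histogram(hist):
    """Return (H(A), H(B), H(A,B)) in bits from a joint histogram."""
    n = sum(hist.values())
    pa, pb = Counter(), Counter()
    for (a, b), c in hist.items():
        pa[a] += c
        pb[b] += c
    h = lambda counts: -sum(c / n * math.log2(c / n) for c in counts.values())
    return h(pa), h(pb), h(hist)

def mutual_information(img_a, img_b):
    """I(A;B) = H(A) + H(B) - H(A,B)."""
    ha, hb, hab = entropies_from_histogram(joint_histogram(img_a, img_b))
    return ha + hb - hab
```

For identical images the MI equals the marginal entropy; for statistically independent ones it is zero.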


Quantum measurements and entropic bounds on information transmission

While a positive operator valued measure gives the probabilities in a quantum measurement, an instrument gives both the probabilities and the a posteriori states. By interpreting the instrument as a quantum channel and by using the monotonicity theorem for relative entropies many bounds on the classical information extracted in a quantum measurement are obtained in a unified manner. In particul...
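The von Neumann entropy that underlies such bounds, S(ρ) = -Tr(ρ log ρ), depends only on the eigenvalues of the density matrix ρ. A minimal sketch for real symmetric 2×2 density matrices (an illustrative toy, not the paper's construction):

```python
import math

def eig_2x2_symmetric(m):
    """Eigenvalues of a real symmetric 2x2 matrix [[a, b], [b, d]]."""
    a, b = m[0][0], m[0][1]
    d = m[1][1]
    mean = (a + d) / 2.0
    r = math.sqrt(((a - d) / 2.0) ** 2 + b * b)
    return mean - r, mean + r

def von_neumann_entropy(rho):
    """S(rho) = -Tr(rho log2 rho), computed from the eigenvalues of rho."""
    return -sum(l * math.log2(l) for l in eig_2x2_symmetric(rho) if l > 1e-12)
```

A pure state (rank-one projector) has entropy 0, while the maximally mixed qubit state diag(1/2, 1/2) has entropy 1 bit.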


Nonparametric Information Theoretic Clustering Algorithm

In this paper we propose a novel clustering algorithm based on maximizing the mutual information between data points and clusters. Unlike previous methods, we neither assume the data are given in terms of distributions nor impose any parametric model on the within-cluster distribution. Instead, we utilize a non-parametric estimation of the average cluster entropies and search for a clustering t...
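The objective sketched above can be written as I(X;C) = H(X) - Σ_c (n_c/n) H(X | C = c), so maximizing the mutual information between points and clusters amounts to minimizing the weighted average of within-cluster entropies. A minimal plug-in version for discretized one-dimensional data (illustrative only; the paper's estimator is nonparametric):

```python
from collections import Counter
import math

def plugin_entropy(values):
    """Plug-in Shannon entropy (bits) of an empirical distribution."""
    n = len(values)
    return -sum(c / n * math.log2(c / n) for c in Counter(values).values())

def mi_objective(values, labels):
    """I(X;C) = H(X) - sum_c (n_c/n) H(X | C=c): the quantity to maximize."""
    n = len(values)
    by_cluster = {}
    for v, l in zip(values, labels):
        by_cluster.setdefault(l, []).append(v)
    within = sum(len(vs) / n * plugin_entropy(vs) for vs in by_cluster.values())
    return plugin_entropy(values) - within
```

A labeling that separates the two values of [0, 0, 1, 1] scores 1 bit; a labeling that mixes them scores 0.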


Normalized Measures of Mutual Information with General Definitions of Entropy for Multimodal Image Registration

Mutual information (MI) was introduced for use in multimodal image registration over a decade ago [1,2,3,4]. The MI between two images is based on their marginal and joint/conditional entropies. The most common versions of entropy used to compute MI are the Shannon and differential entropies; however, many other definitions of entropy have been proposed as competitors. In this article, we show ...
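One common normalization of MI used in registration is the ratio NMI(A,B) = (H(A) + H(B)) / H(A,B), which equals 1 for independent images and grows toward 2 as the images align. A minimal sketch over quantized intensities (a hedged illustration; names are not from the article):

```python
from collections import Counter
import math

def normalized_mutual_information(img_a, img_b):
    """NMI = (H(A) + H(B)) / H(A, B), over flat lists of pixel intensities."""
    n = len(img_a)
    h = lambda counts: -sum(c / n * math.log2(c / n) for c in counts.values())
    ha = h(Counter(img_a))
    hb = h(Counter(img_b))
    hab = h(Counter(zip(img_a, img_b)))  # joint entropy from the joint histogram
    return (ha + hb) / hab if hab > 0 else float("inf")
```

Identical non-constant images give NMI = 2; independent images give NMI = 1, since then H(A,B) = H(A) + H(B).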



Journal title:

Volume   Issue

Pages  -

Publication date: 2009